4 research outputs found

    Tools for expressive gesture recognition and mapping in rehearsal and performance

    Thesis (S.M.)--Massachusetts Institute of Technology, School of Architecture and Planning, Program in Media Arts and Sciences, 2010. Cataloged from PDF version of thesis. Includes bibliographical references (p. 97-101).
    As human movement is an incredibly rich mode of communication and expression, performance artists working with digital media often use performers' movement and gestures to control and shape that digital media as part of a theatrical, choreographic, or musical performance. In my own work, I have found that strong, semantically meaningful mappings between gesture and sound or visuals are necessary to create compelling performance interactions. However, existing systems for developing mappings between incoming data streams and output media have extremely low-level concepts of "gesture": the programming process focuses on raw sensor data, such as the voltage values of a particular sensor, which constrains how users think about movement, requires significant programming experience, and loses the expressive, meaningful, and metaphor-rich content of the movement. To remedy these difficulties, I have created a new framework and development environment for gestural control of media in rehearsal and performance, allowing users to create clear and intuitive mappings in a simple and flexible manner by using high-level descriptions of gestures and gestural qualities. This approach, the Gestural Media Framework, recognizes continuous gesture and translates Laban Effort Notation into the realm of technological gesture analysis, allowing sensor data to be abstracted and encapsulated into movement descriptions. As part of the evaluation of this system, I choreographed four performance pieces that used it throughout the rehearsal and performance process to map dancers' movements to the manipulation of sound and visual elements. This work has been supported by the MIT Media Laboratory.
    by Elena Naomi Jessop. S.M.
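    The abstract above describes mapping high-level movement descriptions, rather than raw sensor voltages, to media control. As a purely illustrative sketch (the class, function, and parameter names below are assumptions for this example, not the Gestural Media Framework's actual API), the following Python fragment abstracts a window of sensor readings into Laban-style Effort qualities and binds those qualities to sound parameters:

        # Hypothetical sketch: names and heuristics are illustrative assumptions,
        # not the Gestural Media Framework's actual implementation.
        from dataclasses import dataclass

        def clamp(x: float) -> float:
            return max(0.0, min(1.0, x))

        @dataclass
        class EffortQualities:
            """High-level Laban Effort description abstracted from raw sensor data."""
            weight: float  # 0.0 (light) .. 1.0 (strong)
            time: float    # 0.0 (sustained) .. 1.0 (sudden)

        def describe_movement(accel_magnitudes: list[float]) -> EffortQualities:
            """Encapsulate a window of accelerometer magnitudes as Effort qualities.

            Stand-in heuristics: mean magnitude approximates Weight, and average
            frame-to-frame change approximates Time.
            """
            mean = sum(accel_magnitudes) / len(accel_magnitudes)
            jerk = sum(abs(b - a) for a, b in zip(accel_magnitudes, accel_magnitudes[1:]))
            jerk /= max(len(accel_magnitudes) - 1, 1)
            return EffortQualities(weight=clamp(mean), time=clamp(jerk))

        def map_to_sound(q: EffortQualities) -> dict:
            """Bind movement qualities to media parameters instead of raw voltages."""
            return {
                "volume": q.weight,           # stronger movement -> louder sound
                "tempo_scale": 1.0 + q.time,  # more sudden movement -> faster playback
            }

        # Example: one window of normalized readings from a dancer's wrist sensor.
        print(map_to_sound(describe_movement([0.2, 0.4, 0.9, 0.3])))

    Keeping the mapping at the level of named movement qualities, rather than individual voltages, is what lets a choreographer reason in terms such as "strong, sudden movement drives loud, fast sound."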

    Transitioning Between Audience and Performer: Co-Designing Interactive Music Performances with Children

    Live interactions have the potential to meaningfully engage audiences during musical performances, and modern technologies promise unique ways to facilitate these interactions. This work presents findings from three co-design sessions with children that investigated how audiences might want to interact with live music performances, including design considerations and opportunities. Findings from these sessions also formed a Spectrum of Audience Interactivity in live musical performances, outlining ways to encourage interactivity in music performances from the child's perspective.

    Capturing the Body Live: A Framework for Technological Recognition and Extension of Physical Expression in Performance

    Performing artists have frequently used technology to sense and extend the body's natural expressivity through live control of multimedia. However, the sophistication, emotional content, and variety of expression possible through the original physical channels are often not captured by these technologies and thus cannot be transferred from body to digital media. In this article, the author brings together research from expressive performance analysis, machine learning, and technological performance extension techniques to define a new framework for the recognition and extension of expressive physical performance.
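    As a hedged illustration of the recognize-then-extend pipeline this abstract describes (the labels, features, and media presets below are invented for the example, not taken from the article), a toy Python version might look like this:

        # Hypothetical illustration, not the article's implementation: a minimal
        # recognize-then-extend pipeline for expressive physical performance.
        import math

        # Assumed training data: (mean speed, acceleration variance) feature pairs
        # labeled with an expressive quality by a human observer.
        TRAINING = {
            "gentle":    [(0.1, 0.05), (0.2, 0.08)],
            "explosive": [(0.9, 0.70), (0.8, 0.65)],
        }

        def recognize(features: tuple[float, float]) -> str:
            """Nearest-centroid stand-in for the machine-learning recognition stage."""
            def centroid(points):
                return tuple(sum(c) / len(points) for c in zip(*points))
            return min(TRAINING, key=lambda label: math.dist(features, centroid(TRAINING[label])))

        def extend(label: str) -> dict:
            """Extension stage: carry the recognized quality into digital media settings."""
            presets = {
                "gentle":    {"reverb": 0.8, "brightness": 0.3},
                "explosive": {"reverb": 0.1, "brightness": 1.0},
            }
            return presets[label]

        live_features = (0.85, 0.60)  # e.g. derived from live motion capture
        print(extend(recognize(live_features)))  # -> {'reverb': 0.1, 'brightness': 1.0}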